On Divergence-Power Inequalities


Similar Resources

On Rényi entropy power inequalities

This paper is a follow-up to recent work by Bobkov and Chistyakov, obtaining improved Rényi entropy power inequalities (R-EPIs) for sums of independent random vectors. The first improvement relies on the same bounding techniques used in the earlier work, while the second, more significant improvement relies on additional properties from matrix theory. The improvements obtained by th...
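For orientation, here is a brief sketch of the objects these results concern, using the standard definitions from this literature (the paper's exact conditions and constants may differ): for a random vector X on R^n with density f_X, the Rényi entropy of order α and the Rényi entropy power are

\[
h_\alpha(X) = \frac{1}{1-\alpha}\,\log \int_{\mathbb{R}^n} f_X(x)^\alpha \, dx,
\qquad
N_\alpha(X) = \exp\!\Big(\frac{2}{n}\, h_\alpha(X)\Big),
\]

and an R-EPI for independent X_1, ..., X_k is an inequality of the form

\[
N_\alpha(X_1 + \cdots + X_k) \;\ge\; c(\alpha, k) \sum_{i=1}^{k} N_\alpha(X_i),
\]

with the works in this line competing on the size of the constant c(α, k).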


Nested Inequalities Among Divergence Measures

In this paper we consider an inequality involving 11 divergence measures. Three of them are logarithmic, namely the Jeffreys-Kullback-Leibler [4][5] J-divergence, the Burbea-Rao [1] Jensen-Shannon divergence, and the Taneja [7] arithmetic-geometric mean divergence. Another three are non-logarithmic, namely the Hellinger discrimination, the symmetric χ²-divergence, and the triangular discrimination. Three more ...
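For reference, the measures named here are standardly defined as follows for discrete distributions P = (p_1, ..., p_n) and Q = (q_1, ..., q_n); these are the usual conventions in this line of work, not text quoted from the paper:

\[
J(P,Q) = \sum_i (p_i - q_i)\ln\frac{p_i}{q_i},
\qquad
I(P,Q) = \frac{1}{2}\sum_i \Big( p_i \ln\frac{2p_i}{p_i+q_i} + q_i \ln\frac{2q_i}{p_i+q_i} \Big),
\]
\[
T(P,Q) = \sum_i \frac{p_i+q_i}{2}\,\ln\frac{p_i+q_i}{2\sqrt{p_i q_i}},
\qquad
h(P,Q) = \frac{1}{2}\sum_i \big(\sqrt{p_i}-\sqrt{q_i}\big)^2,
\]
\[
\Delta(P,Q) = \sum_i \frac{(p_i-q_i)^2}{p_i+q_i},
\qquad
\Psi(P,Q) = \sum_i \frac{(p_i-q_i)^2\,(p_i+q_i)}{p_i\,q_i}.
\]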


Divergence operator and Poincaré inequalities on arbitrary bounded domains

Let Ω be an arbitrary bounded domain of R^n. We study the right invertibility of the divergence on Ω in weighted Lebesgue and Sobolev spaces, and relate this invertibility to a geometric characterization of Ω and to weighted Poincaré inequalities on Ω. We recover, in particular, well-known results on the right invertibility of the divergence in Sobolev spaces when Ω is Lipschitz or, more ge...
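To make the statement concrete, the unweighted model problem behind this abstract (a standard formulation; the paper itself treats weighted variants) is: for every f in L^p(Ω) with zero mean over Ω, find a vector field v in W^{1,p}_0(Ω)^n with

\[
\operatorname{div} v = f \ \text{ in } \Omega,
\qquad
\|v\|_{W^{1,p}(\Omega)} \le C\,\|f\|_{L^p(\Omega)},
\]

with C independent of f; the companion Poincaré inequality reads

\[
\|u - u_\Omega\|_{L^p(\Omega)} \le C\,\|\nabla u\|_{L^p(\Omega)},
\qquad
u_\Omega = \frac{1}{|\Omega|}\int_\Omega u \, dx.
\]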


On a Symmetric Divergence Measure and Information Inequalities

A non-parametric symmetric measure of divergence which belongs to the family of Csiszár's f-divergences is proposed. Its properties are studied and bounds in terms of some well-known divergence measures are obtained. An application to the mutual information is considered. A parametric measure of information is also derived from the suggested non-parametric measure. A numerical illustration to comp...
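As background, the Csiszár f-divergence family referred to here is standardly defined, for a convex function f on (0, ∞) with f(1) = 0, by

\[
C_f(P,Q) = \sum_i q_i\, f\!\Big(\frac{p_i}{q_i}\Big);
\]

the choice f(t) = t log t recovers the Kullback-Leibler divergence, and the mutual information application rests on the identity I(X;Y) = D(P_{XY} || P_X P_Y).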


Refinement Inequalities among Symmetric Divergence Measures

There are three classical divergence measures in the literature on information theory and statistics, namely Jeffreys-Kullback-Leibler's J-divergence, Sibson-Burbea-Rao's Jensen-Shannon divergence, and Taneja's arithmetic-geometric mean divergence. These bear an interesting relationship to one another and are based on logarithmic expressions. Divergence measures like the Hellinger discriminatio...
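As a minimal worked instance of such a refinement (an illustration from the standard definitions recalled earlier, not a result quoted from the paper), the Hellinger discrimination h and the triangular discrimination Δ satisfy

\[
\tfrac{1}{4}\,\Delta(P,Q) \;\le\; h(P,Q) \;\le\; \tfrac{1}{2}\,\Delta(P,Q),
\]

which follows termwise from (√p_i − √q_i)² = (p_i − q_i)²/(√p_i + √q_i)² together with p_i + q_i ≤ (√p_i + √q_i)² ≤ 2(p_i + q_i).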



Journal

Journal title: IEEE Transactions on Information Theory

Year: 2007

ISSN: 0018-9448

DOI: 10.1109/tit.2006.890715